Search results

1 – 3 of 3
Article
Publication date: 1 September 2001

Magnus Lif, Eva Olsson, Jan Gulliksen and Bengt Sandblad

Abstract

Traditional process‐oriented system development methods often result in fragmentary user interfaces, with information presented in various windows without consideration of requirements for simultaneous viewing. Opening, closing, moving and resizing these windows draws the users’ attention away from the actual work. User interface design according to the workspace metaphor could provide skilled professional users with an efficient, customised user interface to administrative information systems. This can improve work performance and facilitate efficient navigation between workspaces. A case study in co‐operation with the Swedish National Tax Board (RSV) describes practical use of the workspace metaphor.

Details

Information Technology & People, vol. 14 no. 3
Type: Research Article
ISSN: 0959-3845

Article
Publication date: 27 February 2007


Details

Disaster Prevention and Management: An International Journal, vol. 16 no. 1
Type: Research Article
ISSN: 0965-3562

Article
Publication date: 3 April 2018

Matthias von Davier

Abstract

Purpose

Surveys that include skill measures may suffer from additional sources of error compared to those containing questionnaires alone. Examples are distractions such as noise or interruptions of testing sessions, as well as fatigue or lack of motivation to succeed. This paper aims to provide a review of statistical tools based on latent variable modeling approaches extended by explanatory variables that allow detection of survey errors in skill surveys.

Design/methodology/approach

This paper reviews psychometric methods for detecting sources of error in cognitive assessments and questionnaires. Besides traditional item responses, computer-based assessment makes new sources of data available – timing data from the Programme for the International Assessment of Adult Competencies (PIAAC) and data from questionnaires – that help detect survey errors.

Findings

Some unexpected results are reported. Respondents who tend to use response sets have lower expected values on PIAAC literacy scales, even after controlling for scores on the skill-use scale that was used to derive the response tendency.
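The notion of a response set – a respondent picking the same answer category regardless of item content – can be illustrated with a toy index. This is a hypothetical sketch, not the latent variable models the paper reviews; the function name and the 1–5 Likert coding are assumptions for illustration:

```python
# Hypothetical sketch of a "straight-lining" response-set index for a
# Likert-style questionnaire (responses assumed coded 1-5).
from collections import Counter

def straightlining_index(responses):
    """Share of items answered with the respondent's most frequent category.

    1.0 means every item received the same answer (a pure response set);
    values near 1/k, where k is the number of categories used, suggest
    differentiated answering.
    """
    if not responses:
        raise ValueError("empty response vector")
    # Count how often the modal (most frequent) category was chosen.
    modal_count = Counter(responses).most_common(1)[0][1]
    return modal_count / len(responses)

# A straight-liner vs. a differentiated respondent:
print(straightlining_index([3, 3, 3, 3, 3, 3]))  # 1.0
print(straightlining_index([1, 4, 2, 5, 3, 4]))
```

The paper's finding concerns something subtler – lower expected literacy scores for respondents with such tendencies even after controlling for the underlying skill-use scale – which requires the latent variable models reviewed there, not a descriptive index like this one.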

Originality/value

The use of new sources of data, such as timing and log-file or process data, provides new avenues to detect response errors. The review demonstrates that large data collections need to make better use of the available information, and that the integration of assessment, modeling and substantive theory needs to be taken more seriously.

Details

Quality Assurance in Education, vol. 26 no. 2
Type: Research Article
ISSN: 0968-4883
